perm filename PLAYER[P,JRA]6 blob sn#445057 filedate 1979-05-30 generic text, type C, neo UTF8
byte lisp setup

J allen  		general introduction in spirit of anatomy condensed

G prini  		implementation of scheme system for Z-80 system

David Stoutmeyer 	8080 lisp with applications to macsyma
	they have a small lisp running mainly to support
	an algebraic manipulation system for high school kids. quite impressive,
	has bignums but no funargs - shallow binding. they were running with 48k.
 	he was very interested
	in 8080 vlisp as he didn't know about it. Dr. Dobb's was here chasing him up
	for an article about his system (will distribute for about $70 in a month or so).
	i told him about your attempts to round up some good lisp articles for a special
	issue of byte and he told me he would contact you about it either by phone
	or over the net. he seemed to prefer appearing in your lisp special to Dr. Dobbs.
	he is writing a book on algebraic manipulation.   peter

	PETER MILNE TOLD ME THAT YOU ARE EDITING A SPECIAL ISSUE OF
	BYTE, DEVOTED TO LISP.  A COLLEAGUE OF MINE, AL RICH,
	IMPLEMENTED A TINY LISP FOR THE 8080.  IS IT TOO LATE
	FOR HIM TO SUBMIT AN ARTICLE?  IF NOT, COULD  YOU TELL ME THE DEADLINES AND
	THE EDITORIAL GUIDELINES?

William Kornfeld  	pattern directed invocation
	I am finishing up a master's thesis on a pattern directed invocation
	system implemented on top of Lisp (ala Planner, Conniver, etc.).
	These languages make heavy use of list structure and would be a
	good example of the extensibility of Lisp.  I would be interested
	in writing an article on what pattern directed languages are, what
	they are used for, and how they can be implemented in Lisp.

Phil Agre		functions in lisp -- philosophy and hacks
	       I am in the process of preparing an article on the implementation
	of functions in LISP, half philosophy-of-LISP, half assembly-language
	coding hacks, which comes in 20- and 50- double-spaced-page versions.
	It will appear this month (w/ luck) as a Univ. of Maryland Technical
	Report under the title "Functions as Data Objects -- The Implementation
	of Functions in LISP".  Would you be interested in receiving a copy
	of this report with an eye towards publishing it or a version of it
	(e.g., the philosophy half or the hacking half) in the August '79
	LISP issue of BYTE?  

Vaughan Pratt		theoretician looks at lisp
	For Byte I would be willing to talk (a) about the contribution made to
	LISP by mathematical logic and related stuff (b) how syntax is a red
	herring to some extent in that LISP syntax and LISP semantics can
	be cleanly separated and the latter is more interesting, (c) what
	a theorist like me finds useful about LISP in practical programming,
	and/or (d) what features of LISP and APL could be elegantly combined
	to form a more winning language.  I'm unqualified to talk about
	history, and not particularly anxious to talk about implementation
	issues - how to do it, who's done it, what they did, etc.


H. Baker		garbage collection
	I'd love to submit an article on (what else?) garbage collection!
	What's the scoop?

r zippel, inc		architecture
	This is to inform you that some subset of us will be submitting a
	paper for the August issue of BYTE.  The title will probably be
	something like "LIL - A Lisp Implementation Language". We will
	present a Lisp based compiler for implementing system software.
	With the language (compiler and perhaps interpreter) embedded in
	a Lisp-like environment, the user will have access to the Lisp
	debugging tools.  We feel that the ability to manipulate programs
	via macros coded in Lisp will speed the code writing phase.  This
	language should be usable on micro-processors without difficulty.


efficiency


ach
gls
rg?


lisp
 spaces
  atoms
     plist[pname, value cell, doc pointer, property list]

  numbers
  lists    there are no dotted pairs! (simplifies reader and printer and user)
  strings
  arrays (?)
  primitive/compiled
  frames(?)

the interpreter is a finite state machine representation of a non-recursive
eval; it will interface closely with the debugger.

there will be only ONE value cell with an atom (no cval-fval crap).
a version of linkers will be used.

No file system per se; all pseudo virtual memory, with fasload-like
elements on the property lists of functions. 

a caching scheme for migrating LRU functions out of lisp space and onto
the disc in a compact format; perhaps something better than print-to-disc.
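
The migration bookkeeping itself can be tiny. A hypothetical sketch (Common Lisp notation, invented names): keep the function names in most-recently-used order, and take the tail of the list as the candidate to write out in the compact format.

(defvar *lru-functions* '())            ; most recently used first

(defun note-function-use (name)
  ;; move NAME to the front of the LRU list on every call
  (setq *lru-functions* (cons name (remove name *lru-functions*))))

(defun migration-victim ()
  ;; least recently used function: the one to push out of lisp space
  (car (last *lru-functions*)))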

if time and energy permit, consideration will be given to arrays and strings.

two-level storage paper

compiler
 generate pseudo-code for porting: lisp to L-code
 assembler and fasload
  small L-code interpreter

A "cheap" compiler which generates an intermediate code should be fairly
straightforward to design. The code will be interpreted by a small
overlayable interpreter.
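
To make the intent concrete, here is a toy sketch of what such an intermediate "L-code" and its small interpreter might look like (Common Lisp notation; the opcode set is invented for illustration): the code is a vector of opcode/operand pairs worked by one dispatch loop and a value stack.

(defun run-lcode (code env)
  (let ((stack '()) (pc 0))
    (loop
      (destructuring-bind (op . arg) (aref code pc)
        (incf pc)
        (ecase op
          (:const (push arg stack))                         ; literal operand
          (:var   (push (cdr (assoc arg env)) stack))       ; variable reference
          (:add   (push (+ (pop stack) (pop stack)) stack)) ; a primitive
          (:ret   (return (pop stack))))))))                ; leave the interpreter

;; e.g. a compiled (plus x 2) might become:
;; (run-lcode #((:var . x) (:const . 2) (:add) (:ret)) '((x . 40)))   => 42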

should it be clever about tail recursion?

what about threaded code?

note: no need to swap OUT compiled code since it is read/execute only.

what about block compilation?


editor
 screen oriented
  two-dimensional ellipses
  ()-blinkers
  structure sensitive
  output driven by types

RANDOM EDITOR NOTES
fundamental law: at any instant the structure is a well formed list!!!

screen is defined as a two-dimensional window on a piece of list structure.
Actually the editing operations take place on a "text" representation
rather than  the internal list structure. There is a pretty print to the screen
and the edit commands appropriately transform the screen representation
and the internal form.


The pretty printer must be able to ellipsize list structure, both horizontally
and vertically.  

e.g.    (
(def clear_lin (rel_lin wd) (cond ((#) (#)
				  (t (#)))
			    (setq z #)
			    . . .
			    (while z>0
				(do (set (V z) chr)
				    . . .
				    (decr z1)))
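
For what it is worth, both kinds of ellipsis have a direct analogue in later LISPs; as a hedged illustration in Common Lisp, the printer abbreviates over-deep structure as "#" and over-long structure as "...", which is essentially the display behavior wanted here.

(let ((*print-level* 3)        ; depth cutoff: deeper structure prints as #
      (*print-length* 4))      ; length cutoff: longer tails print as ...
  (print '(def clear_lin (rel_lin wd)
           (cond ((zerop wd) nil)
                 (t (while (> z 0) (do (set (v z) chr) (decr z1))))))))
;; prints roughly: (DEF CLEAR_LIN (REL_LIN WD) (COND (# NIL) (T #)))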

alternative:  perhaps minsky's ideas will work here??

with these ellipses come two kinds of positioning commands (and their inverses)

1) magnify (shrink) a "#-node"
    magnify works by pointing at the desired node and smashing the appropriate key/button;
    that expression takes over the screen;  shrink reverses the process,
    repretty printing the predecessor structure; magnify operations can be nested
    arbitrarily.

2) scroll to  move a "...-node"

3) there should also be an explicit "repretty print the screen" command
   because editing will modify nesting relationships. this reformat should
   NOT be automatic because it can drastically modify the screen; the effect
   can be visually very disconcerting. However, in insert mode, a LOCAL
   pp may be useful to make structure clearer.  (compare auto-justify in word
   processors: perhaps it's ok to do on-the-fly line padding at the
   end of lines when text is being entered; but wrong to jiggle the text
   during deletion or insertion in existing text)


ordinary typing:  
  text mode
	there is some difference of opinion whether the default key mode
	is "insert" or "replace"; requires experimentation.

	typing a "(" supplies a gratituous ")" and puts the editor in
	insert mode between the parens.

	typing a ")" blinks the corresponding left paren.

  delete mode:chrs
	deleting "(" or ")" dumps it and its corresponding paren
        deleting a chr, it goes away

  delete mode: words
        deleting parens deletes the whole expression
        deleting a chr, deletes the containing atom

  insert mode:
        inserting a "(" does it and generates an attached ")"; that
	right paren must be deposited to the right of the left paren
	and at the same nesting level as the left paren. cannot
	leave attach mode until paren is properly dropped (of course,
	attach mode can be aborted and "(" goes away)

        e.g. in ((A B C (F G) 5) 4), if we insert a left paren before the B
	the right paren cannot be dropped inside (F G) or outside 5)

	similarly for inserting ")", except direction is to the left

	ordinary characters just go in

cursor motion: 
	character mode (by chrs)
	structure mode (by structure)

attach mode:
	either attach text in reverse video for small crap
	or open new window for massive cutting, copying, and pasting


The real whiz is structured editing, particularly for program construction.
The idea is to program by "selection", typically selection of control 
constructs. For example the "cond-key/button" generates a conditional
template like  (COND (# #) (# #))  and places the cursor at the first #.
the element is now ready for expansion.	here are some suggestions
(in terms of control keys, but mouse buttons and menus are probably better);
a small sketch follows the key list below.
	↑C (COND (# #) (# #))
	↑O (T #)   the "otherwise" key
	↑W (WHILE # (DO #))
        ↑D (DEF # # #)
	↑* inside a DEF supplies an instance of the DEF's name
	    this is useful to encourage long (meaningful) names
	    like "(DEF GET_RANDOM_USEFUL_PROPERTY    )" without requiring lots
	    of typing.
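
A minimal sketch of the key-to-template machinery (Common Lisp notation; the marker :hole stands in for the "#" holes above, and all names here are made up):

(defparameter *templates*
  '((#\C . (cond (:hole :hole) (:hole :hole)))
    (#\O . (t :hole))
    (#\W . (while :hole (do :hole)))
    (#\D . (def :hole :hole :hole))))

(defun expand-template (key)
  ;; return a fresh copy of the template for KEY; the editor would then
  ;; park the cursor on the first :hole, ready for expansion
  (copy-tree (cdr (assoc key *templates*))))

;; (expand-template #\C)  =>  (COND (:HOLE :HOLE) (:HOLE :HOLE))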

interesting hack: incremental search (↑Sstring<alt>)
	in search mode, as each character is typed the cursor moves through
	the text until enough characters  are given to describe the desired
	string. back-space will back up the string and the search;
	alt-mode, say, will set the string. ↑s<alt> would find next occurrence

debugger
 screen oriented
  graphic machine description
   finite state lisp machine

____________________________________________________________________________
|									    |
|									    |
|  DEST _________________________ 					    |
|	| v1   | ↔ | e1		|					    |
|	| v2   |   | e2		|					    |
|	|			|	or {exp i}			    |
|	|	. . .		|					    |
|	| vn   |   | en		|					    |
|	-------------------------					    |
|						MAR TABLE		    |
|  ENV  _________________________	________________________________    |
|	| x1	|   v1		|	|				|   |
|	| x2	|   v2		|	| vari: env (condition) function|   |
|	|	. . .		|	|				|   |
|	-------------------------	|				|   |
|					|				|   |
|	_________________________  	|				|   |
|	| y1	|   u1		|	|				|   |
|	| y2	|   u2		|	|				|   |
|	| y3	|   u3		|	|				|   |
|	|	|		|	|				|   |
|	-------------------------	|				|   |
|					---------------------------------   |
|		. . .							    |
|									    |
|									    |
-----------------------------------------------------------------------------

when a λ-binding  is happening the "dest form" of the display is active
with arrow indicating next operation

when {expi} is happening the dest area window is the ellipsized pretty-printed
version of the exp's with an appropriate cursor.

if a "step" occurs during the "dest" phase, dest disappears and the ei takes 
its place; after completion, dest is popped back

after dest is completed, it takes its place on the top of ENV with  the bottom
env block being pushed off the screen if necessary.

The MAR table is a user selected set of variables which can be monitored for 
variable access paths, including environment information as well as "break-within-
function" selection.


some debug commands
	↑S step current expression through one eval cycle
		if expression is fun-app then open that evaluation
		and pause

	↑X execute current expression 
		return to  listen loop when completed

	↑U evaluate until ready to "go up a level"; i.e. return
		to caller

	↑B break at entrance to  expression's evaluation

	↑G start (go)

	↑P proceed from break

	↑M  MAR trap; trap access on read, λ-bind or setq write;
		selectable in environment  ... see conniver
		this will  enter variable in MAR trap region of screen
		as protected register.


documentation
 screen oriented
  on-line, accessible through editor
    or indirect through atom

The essential ingredient here is to have help, aid, and comfort
available at the keystroke level. For example, at power up
the system can default to a "do you want a cheap tour" mode which
leads people by the hand through a simple  session. this probably
should be easily disabled because the text will take a lot of space
and will tend to irritate the experienced user. 

At any rate, documentation should be accessible online. This can be
done directly by overlaying with wordsmith, or by pointing at an atom
and smashing the help button/ "?" key. In this latter case, a window
will open containing the documentation (if any) associated with that atom.
User comments may be stored in this documentation area. The LISP editor
will recognize "comment mode" and stuff the material under the
appropriate atom's space. 
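
As a rough illustration (standard property-list operations in Common Lisp syntax; the DOC property name is hypothetical), storing and retrieving the per-atom documentation amounts to:

(setf (get 'clear_lin 'doc)
      "Clear a display line; user comments from the editor go here too.")

(defun show-doc (atom)
  ;; what the help key would display for ATOM, if anything
  (or (get atom 'doc) "no documentation"))

;; (show-doc 'clear_lin)  =>  "Clear a display line; ..."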

Checks should of course be made to make
sure the documentation is consistent with the associated function.
There should be an unobtrusive way of asking the user if the documentation
should be updated when a change is made to the function definition, for
example.

virtual memory
 floppy/hard memory
  hardware support
   segmented procedures
   data areas


interface
 hardware
  screensplitter
  sail keyboard (?)
  mouse (?)

 software
  menu-driven
  control keys
   
The compiler, editor, and debugger are all written in lisp.  The  strategy
should be to consolidate what is known, leaving invention for version 2.

March: Intensive search, refinement, and discussion of features for editor
       and debugger.  Timing tests for VM schemes and feasibility of fancy
       storage managers  (think dual  processor?  --on-board  gc).  
	
       Examine pseudo-code/target  machine  hacks;  need  balance  between
       porting and speed with speed critical for version 1.

April-
August:Several activities  can occur  in parallel;  however, it may  be  worth
       having two machines available so low-level VM-interface development
       doesn't  disturb  systems  development  and  documentation   phases
       (tempers will get short!!)  Basically write code with BOTH hands!!


Sept:  Intensive QA, and as much guinea-pigging as possible.

For our effort, we  will have the best  micro-editor available; with  some
very interesting hardware coming over  the hill. The hardware  integration
and support will  be done  by a  very competent  group, knowledgeable  and
SYMPATHETIC to lisp.

It is difficult to judge how much effort is involved in each piece,  since
we're not  building pieces;  that's  why the  March  effort is  of  utmost
importance. Given that we find the proper "seams", I'd guess the following
relative order of difficulty and importance:

 AREA	    DIFFICULTY     IMPORTANCE
compiler	1.	      .3
virtual mem	.95	      .7
debugger	.7	      1.
editor		.5	      1.
documentation	.2	      1.

Midstream (about June 15)  in the development of  version 1 we must  begin
establishing the structure  of version 2.  That should either  be our  own
processor or a  new microprocessor. Actually both  strategies should  be
pursued; we should  be prepared to  support lisp on  any processor.   This
venture will require more funds and full-time effort.

byte questions
 average hobbyist has?
 idea about selling price?

luser questions
 what you got
 what you'd pay

finish byte article

cover letter for patterson

contract
 we retain software
 they market
 percent


hardware
  screensplitter
  keyboard
  big floppy
  memory

byte book
     evaluators
      state machines
      tail recursion

     storage management
      cdr-codes
      virtual memory
      garbage collection

     compilers
       
     parsing
      recursive descent
      pratt

     games
      alpha-beta
       

santa clara/mr
  lcf
  ai
  macsyma
  lisp

We propose to specify and develop  the industry standard LISP much in  the
spirit in which UCSD has  cultivated Pascal. This effort involves  several
phases:

1. CULTIVATE  THE MARKET.   My book,  Anatomy of  LISP, has  already  been
successful at the university and research end of the LISP spectrum.   What
is needed is a comparable impact  at the hobby and intermediate level.   I
have begun the preparation of that market segment with the March 1979 BYTE
editorial as a lead-in  to more detailed BYTE  articles.  The response  to
the editorial has been amazing.  As a result of the expressed interest,  I
have begun a LISP Users/Interest Group which will be announced in the July
BYTE magazine.  The August BYTE will be a special issue dedicated to  LISP
and its applications.  The  interest in that enterprise  has been so  high
that BYTE will present the overflow papers in the following month's issue.

2. IMPLEMENT  A HIGH  QUALITY  LISP.  Currently,  there are  several  LISP
dialects available  on the  larger machines;  all have  certain  anomalous
features, mostly due  to historical  reasons. Our  design is  a clean  and
generalized LISP  which will  fit within  the confines  of a  contemporary
eight-bit micro, but will expand gracefully to the forthcoming sixteen-bit
machines.  We perceive  the major  failure in  other (non-toy)  micro-LISP
implementations to be an inadequate  concern for the human interface.   In
particular, our  LISP  will  NOT  run in  a  hardcopy  or  glass  teletype
environment. I have  been associated with  the Stanford AI  Lab since  the
days of  the  PDP-1,  being  part  of the  original  design  team  of  the
display-based AI system, then a  research assistant, and finally  research
associate.  As  a result  of this  experience, I  believe that  there  are
certain aspects  of  interaction  which  must  not  be  compromised  if  a
successful and useful system is  to result. We understand the  limitations
of  micro-computer  implementations,   but  believe  that   most  of   the
conservative decisions of contemporary systems design stem from a lack  of
knowledge of what can be done, rather  than a lack of function within  the
processors; witness the  fact that  the display philosophy  of the Stanford  AI
system was  developed on  the PDP-1  with a  10 microsecond  4K-by-18  bit
memory for all program and data.  Therefore our LISP design is premised on
display interaction: a display editor which knows about list-structure,  a
display debugger which  knows about  a LISP architecture,  and an  on-line
documentation feature  which will  always  be available  to the  user  for
documenting new functions  or querying  the system about  the behavior  of
existing functions.

Our plans have been  based on the availability  of three North  Star-based
development systems promised to us  by Micro Diversions, the  manufacturer
of the ScreenSplitter. Though the systems were promised six weeks ago, Micro
Diversions is not able to supply them;  therefore we are looking elsewhere.  The
major attractions  of Micro  Diversions were  their interest  in LISP  and
their ScreenSplitter, a 40-line by 86-character S-100 video interface card
with an onboard  PROM to  specify and  manipulate user-defined  "windows".
These windows are rectangles which contain text and may be manipulated  as
independent screens, complete with their own scrolling discipline, even to
the extent of  moving the  rectangles around  on the  screen.  Though  not
perfect, the  ScreenSplitter  made a  very  attractive base  on  which  to
develop our software.  As things stand now we will either produce our  own
version of the ScreenSplitter, buy  quantities of the ScreenSplitter  from
Micro Diversions, or look for an alternate vendor.

3. PRODUCE  A QUALITY  TEXTBOOK.   Education is  a critical  component  in
building a market for  LISP and its applications;  the experience of  UCSD
and their Pascal attests to that. Therefore  I am writing a new LISP  book
which will use our LISP as the programming dialect and will reinforce  the
programming style  which makes  LISP so  powerful.  The  book will  be  an
intermediate point between that of the "Anatomy of LISP" and the  current,
rather disappointing offerings, at the  LISP programming level. I plan  to
maintain control of the publication myself. After dealing with McGraw-Hill
throughout all phases of the publishing process, I am confident that I can
do no worse myself. I  will create the text,  contract for a copy  editor,
supply a  compositor  with  floppies which  contain  the  formatted  text,
contract for the binding and production, and will do my own marketing.


4.  EXPLORE THE MARKET POTENTIAL.  Clearly, the market for unadorned  LISP
machines is limited. LISP is a tool; I  do not wish to be in the  business
of selling tools. The real markets  for LISP are those products which  the
"LISP tool" allows us  to build:  natural language  access to data  bases;
more general and  flexible data base  organizations; intelligent  systems,
called "expert systems" which can interact with the user in a  non-trivial
dialog;  "calculators"  which  perform  non-numerical  mathematics,   like
MACSYMA  and  REDUCE;  intelligent   control  of  real-time  devices   and
instruments.  Clearly,  all  of  these  applications  can  be  written  in
languages other than  LISP; however,  it has  been the  experience of  the
twenty years  of AI  research that  LISP allows  more flexible  and  rapid
development of such complex programs. The next revolution in the  computer
industry will be  in software, not  hardware.  Those who  survive will  be
those who understand software  and are able  to capitalize on  techniques,
like LISP,  which  give added  leverage  in the  construction  of  complex
software.



WHAT WE NEED.  There  are three major participants  in the current  phase;
each of  us needs  a development  system.  The  system promised  by  Micro
Diversions consisted of: 1) a North Star based microcomputer with at least
48K of ram, 2) two North  Star mini-floppies, 3) the ScreenSplitter, 4)  a
22MHz 15" P39 Motorola monitor, and 5) a Keytronix keyboard (these systems
were to be provided to us on a buy-back basis.)  The only component  which
is missing in this configuration is  a hardcopy device; that must also  be
supplied.

Our major concern  now is  to secure  comparable systems  from a  reliable
supplier; we do not wish to get  mired down in a "hardware swamp" or  have
our software efforts blunted by inadequate development software. Our major
constraint is that  the system support  a ScreenSplitter-like device;  the
mini-floppies are  a  weak constraint  since  the BYTE  readership  survey
reveals that these devices  are the predominant  mass storage device;  the
8080/Z80 is a strong constraint since  our current software is 8080  based
and the 8080/Z80 is owned by 56% of the BYTE audience.



WHAT WE OFFER.  Our aborted  negotiations with Micro Diversions  specified
that we would retain the software  rights to the systems which we  develop
and we would license  Micro Diversions to market  our software in a  joint
venture with an  as yet unspecified  division of the  profits between  the
participants.   We   will  offer   comparable   agreements  in   our   new
arrangements.

A FINAL NOTE. We view  this micro-LISP project as  the initial phase of  a
much larger  venture.   None of  us  are business  oriented  by  training;
therefore we feel that it is most important that we shake out the  details
of such business  plans at this  low-risk level before  entering into  the
real business of LISP and applied Artificial Intelligence. (Our experience
with Micro Diversions confirms this caution).  Furthermore, none of us  is
particularly wealthy; that is the only reason for offering our project  to
outside concerns. Each of us is  willing to contribute their savings as  a
sign of good  faith and dedication.   Moreover, I personally  have made  a
commitment to see this project  to completion; that commitment involves  a
substantial loss of salary.  If we cannot come to an acceptable  agreement
with an outside source almost  immediately, I will break off  negotiations
and make further personal sacrifice to assure the success of this  venture
without external  funding.  In  view of  that possibility,  we would  look
favorably on  a  source who  could  instantly supply  three  appropriately
equipped machines at an attractive  price.



A description of the stages in the project, with costs and return.
__________________________________________________________________________
Phase 0: Supply a high-quality micro-LISP programming environment to fit
     within an existing S-100 8080/Z-80  system. This system supplies  a
     hardware module equivalent to the ScreenSplitter, and LISP software
     consisting of an interpreter, structure editor, display debugger, and
     on-line documentation system. The  system may include a  simplified
     virtual memory system  which will use  a mini-floppy.  The  minimal
     configuration requires 32KB.  We will supply a pseudo-code compiler
     and assembler/interpreter for systems with 64KB.

     This Phase is supported on a part-time basis except for  John Allen 
     who is dedicating full-time to its completion.
 
 Time scale: This Phase is already underway and will be in a sufficiently
     completed stage that we can feel justified in announcing its avail-
     ability  in the August BYTE (the deadline for space reservation for
     that  issue  is  June 9,  with camera-ready copy to them within two
     weeks.)

 Expenses: 300 ScreenSplitters at $275 (quantity prices)	$82,500 
	   3 development machines				 12,000
	   Advertising						  3,000
	   Supplies: (paper, discs, phone, misc)		  1,000
								_______
 Total expenses:						$98,500

 Revenues: 300 systems at $500				       $150,000 

 Return on investment: (34% plus machines)		        $51,500 


Phase 1: Supply a total integrated system based largely on the software
     of  Phase  0,  but  including  our  own  display  processor, key-
     board, processor, memory and mass storage.   This machine will be
     attractive in the hobbyist market, the educational market, and be
     acceptable to low-end business applications which stress interact-
     ive behavior rather than massive computation.

     This phase is a much more aggressive program both in terms of capital
     and payoff. By this time we should have both a full-time  and  part-
     time staff.

 Time Scale: This Phase should begin the planning stage about June 15, 
     germinate through the final stages of Phase 0, and go into full-
     scale operation so that it can be available around February 1, 1980.

 Expenses: Salaries 						$100,000
	   Facilities:						  25,000
	   Development machine				 	  12,000
	   Advertising						   7,000
	   Travel						   5,000
	   Supplies: (paper, office supplies, phone, misc)	   3,000
								________
								$152,000

 Components cost for 100 systems:
						Processors	  10,000
						Display systems	  50,000
						64KB memory	  50,000
						Mass storage	  50,000
						Keyboards	   7,500
								________
								$167,500

 Total expenses:						$319,500

Revenues on 100 systems at $7,500:				$750,000

Return on investment: (57% plus machine)			$430,500
   


Phase 2:   These systems  represent  very respectable  LISP  processors,
    capable  of  supporting  full-scale  business  applications.   These
    machines will be capable  of supporting both  symbolic  and  compu-
    tational applications in applied Artificial Intelligence.
    Note that  there  is a  wide  equipment (and  therefore  price)  gap
    between Phase  1  and  Phase  2  machines.   Clearly  we  will  have
    offerings which lie in between these extremes.

 Time Scale: This Phase should begin the planning stage about August 15, 
     with the planning stage extending to January 15, and the production
     cycle completing around June 1, 1980.

 Expenses: Salaries 						$200,000
	   Facilities:						  30,000
	   Development machine				 	  12,000
	   Advertising						   7,000
	   Travel 						  12,000
	   Supplies: (paper, office supplies, phone, misc)	   6,000
								________
								$267,000

 Components cost for 50 systems:
						Processors	  10,000
						Display systems	  25,000
					  	256KB memory	 100,000
						Mass storage	 120,000
						Keyboards	   5,000
								________
								$260,000

 Total expenses:						$527,000

Revenues on 50 systems at $27,000 each:		 	      $1,350,000

Return on investment (61%):				        $823,000



The specification  of a  powerful micro-LISP  involves criteria  different
from those which dictated the  form of LISP 1.5  on the IBM704.  Much  has
been  learned   about  language   design  in   the  interim;   programming
environments  have   changed   from  batch   processing   to   interactive
development; and, the general architectural differences between the IBM704
and the Zilog Z-80 are substantial.  Therefore it is necessary to  specify
a LISP which, in substance, agrees with LISP 1.5 but differs in  peripheral
areas and is improved and unique (particularly in the micro-LISP domain).



General Characteristics of the LISP
____________________________________________

All features commonly known as "pure LISP"
car, cdr, cons, atom, eq, list, null, cond, quote
and, or, not, 

the interpreter and its subsidiary functions
eval, apply

function definition facilities
de, df

error handling 
errset, err, trace, break, untrace, unbreak

list modification functions 
rplaca, rplacd, nconc, delete





iterative LISP (the prog feature)
prog, go, set, setq, return

property-lists and their associated operations
get, getl, putprop, remprop


input/output facilities compatible with a serial terminal interface
read, readch, ibase, obase, intern
print, princ, prin1, terpri, ascii, cascii

general auxiliary functions
equal, subst, append, copy, reverse, member, memq, length,
gensym, select, gc, remob


functional-related functions (w.o. free variables)
function, mapcar, map

numeric functions (integers only)
plus, minus, difference, times, quotient, remainder, add1,
sub1, max, min, lessp, greaterp, zerop, minusp, numberp

This is not an exhaustive list, only an indication of those functions which
must be present in an effective implementation.


General extensions and improvements over LISP1.5
____________________________________________
Macros and read macros, defined by dm and dmc
These are powerful language features, added to LISP after 1.5;
excellent for data abstraction and language extension.
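
As a hedged example of the data-abstraction use (written with Common Lisp's defmacro; the dialect's dm would be used analogously), a macro can hide the representation of a record behind generated constructor and selector functions:

(defmacro defrecord (name &rest fields)
  ;; define MAKE-<name> and one selector per field, hiding the list representation
  `(progn
     (defun ,(intern (format nil "MAKE-~A" name)) (&rest vals)
       (cons ',name vals))
     ,@(loop for field in fields
             for i from 1
             collect `(defun ,field (r) (nth ,i r)))))

;; (defrecord point x-of y-of)
;; (x-of (make-point 3 4))  =>  3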

Structured iteration
while, repeat, do, escape, etc.
These structured iterations will give added clarity to LISP  programs and
will tend to discourage the use of the ancient prog-feature.
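
A minimal sketch of how such a form can be layered on the old prog feature (Common Lisp defmacro notation; the real definition would live in the dialect's own macro facility):

(defmacro while (test &body body)
  (let ((top (gensym)) (out (gensym)))
    `(prog ()
        ,top (cond ((not ,test) (go ,out)))   ; leave when the test fails
             ,@body                           ; loop body
             (go ,top)
        ,out)))

;; e.g. (let ((z 3)) (while (> z 0) (print z) (decf z)))  prints 3 2 1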

Small integers
This is an improved technique for storing integers, giving faster arithmetic
than 1.5.
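
The usual trick, sketched below with made-up names, is to keep a small integer directly in the pointer word behind a tag bit, so arithmetic never has to allocate cells:

(defun make-small-int (n) (logior (ash n 1) 1))   ; low bit 1 marks "small integer"
(defun small-int-p (w) (logbitp 0 w))             ; test the tag bit
(defun small-int-val (w) (ash w -1))              ; recover the value

;; (small-int-val (make-small-int -7))  =>  -7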

Autoload
This is a MACLISP feature which will allow better utilization of main memory
by storing low-usage function definitions on the disk.
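
A rough sketch of how the stub could work (Common Lisp notation, hypothetical file name): the function cell initially holds a loader which fetches the real definition on first call and then re-invokes it.

(defun install-autoload (name file)
  (setf (symbol-function name)
        (lambda (&rest args)
          (load file)                        ; loading FILE is assumed to redefine NAME
          (apply (symbol-function name) args))))

;; (install-autoload 'prettyprint "pp.fasl")   ; stub until first use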

Strings
Most modern LISPs supply some string manipulation facilities. We will
supply "characters" and "string" data types, along with their appropriate
manipulative operations.

Shallow binding
This is an improved technique for storing (and accessing) the bindings of
LISP variables. It offers increased speed for variable references.
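
As an illustration only (Common Lisp notation, hypothetical helpers): with shallow binding the current value lives in the atom's single value cell, and a λ-binding swaps the old value out onto a save stack, so a variable reference is one cell access rather than a search.

(defun shallow-bind (sym new-value save-stack)
  (push (cons sym (symbol-value sym)) save-stack)  ; remember the old value
  (setf (symbol-value sym) new-value)              ; swap the new one into the cell
  save-stack)

(defun shallow-unbind (save-stack)
  (let ((saved (pop save-stack)))                  ; restore on function exit
    (setf (symbol-value (car saved)) (cdr saved)))
  save-stack)

;; (setf (symbol-value 'x) 1)
;; (setq s (shallow-bind 'x 2 '()))   ; x is now 2, the 1 is saved
;; (shallow-unbind s)                 ; x is 1 again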




Improved I/O
"pretty-printing" of input and output to improve the readability of LISP
expressions. Output can be printed in an "ellipsed" form.
Certain display-related primitives will be  supplied, allowing cursor-addressed
output of data.
General access to the operating system's file structure will be provided,
with functions to read and write LISP data to and from files.

Again, more improvements may be forthcoming and implemented subject to approval
of both parties.

General omissions from LISP 1.5
____________________________________________
Punch card and operating system anachronisms
Should need no explanation!

Floating point arithmetic
Given the  instruction set  of the  Z-80,  and the  usual domain  of  LISP
programs, floating point arithmetic is a non-critical feature.

Functional arguments with free variables
A logically  complete  treatment  of  functional  arguments  includes  the
mechanisms to  handle  the  occurrence of free  variables.  Those  mechanisms
involve a significant amount of overhead which must be incurred throughout
the implementation. Since these  uses of functional  arguments tend to  be
anomalous rather  than  common,  most implementations  refrain  from  this
generality. We therefore will  follow suit, opting for  a LISP which  will
execute rapidly.


optional features
____________________________________________
Arrays

arbitrary precision arithmetic

pretty-printing structure editor

interactive debugger (above trace and break)

bank-switching for larger list space

compiler and assembler

knowledge engineering
--------------------------------------------
general characteristics:
	large, high performance systems with critical human interaction times
	  based on user expectation or real-time systems, driven by on-going 
	  external events.

examples: 
	natural language processing (analysis and understanding)
	computer vision
	symbolic computation (macsyma, program manipulation systems)
	computer aided manufacture and design

interesting characteristic: the majority of  such systems are lisp-based.
	why? characteristics of language as being good at symbolic manipulation
	   apply both in the algorithms of the final program and in the program
	   development environment which contains lisp.

the goal:
	to develop cost-effective lisp machines to replace the existing hardware
	   and to develop less-expensive hardware to attract new applications
	   of lisp-based systems.  These new applications range from  small business
	   applications to the educational market of hobby computation.
--------------------------------------------
    intelligent systems (inlat)/mit-rts/expert systems
--------------------------------------------
constraints
    organizational structure of complex exploratory  investigations
	guide development by specifying what is known in a fashion
	that a system can use it


An interesting change  is occurring  at the frontiers  of language  design,
artificial intelligence applications  and computer  aided design.   People
are beginning to consider system description as a viable means for driving
system prescription.   The  underlying threads  in  all of  these  diverse
applications are:

     1) the complexity of the systems is surpassing the understanding  and
     traditional programming ability of the constructors; program segments
     which are locally correct are later found to be inconsistent in their
     larger environment.

     2) the systems designer is able to specify much of the expected behavior
     of the system, both in terms of what it can do and what it cannot do.

The interaction of these ideas has given rise to a program development
approach called "constraints".

    compare with procedural/declarative fracture
	e.g. axiomatic vs. pl.

    manipulation of constraints ala logical implications: "propagation" 

    relate to truth maintenance  and "dependency analysis"

    states must contain reasons for their existence and  be able to explain
	their existence.

    addition of information must be justified and  must recognize the occurrence
	of contradictory information

    applications to cad

    applications to general programming problem
--------------------------------------------
    cad/gls/icarus
applications of constraints and truth maintenance

--------------------------------------------
    cam (computer aided manufacture) samet/reiger

	intelligent control of tools implies real time symbolic computation

	decision making and task planning in changing environment (uncertainty)

	programs reasoning about programs; program  modification

	a natural for interactive development

	running environment is also model of interaction: now between machines

	program elements:
	  large shared data base of facts and processes (tools and applications)
	  planning strategies (in development phases and in repair mode)
		alternative worlds
		deduction inference
		goal directed/associative processes

	  flexible: program ≡ data
		    dynamic storage
		    multiprocess 1) as control of prog; 2)as planning strategy

-----
on flexibility and other factors:
the production process is a network of flow of parts and processes.
it has many alternative parts, each subject to breakdown; therefore
a premium is placed on real-time correction and modification

the controller must be able to recognize flawed parts, terminate activity,
and subject them to repair or rejection.

therefore we have an intelligent distributed computation system which must
be ultra-reliable (even to the point of self-repair) and which works in an
environment of uncertainty.
-----


	why lisp: it is an assembly language for complex tasks, with nothing built in
		about the application
		  integrated programming environment is important for rapid development
		of complex tasks (emphasis on flexibility)
		compiler/editor/debugger/sdio are tools

		  more a.i. specific languages  have been built on lisp
		    micro-planner data bases with active (procedure elements)
		    	leading to pattern directed invocation

		    conniver more general control (useful in planning and repair
		      phases) more general world modeling for hypothetical
			 reasoning
		     later: qlisp, krl, constraints languages


--------------------------------------------
    robots   (random fact: justified if cost < four man-years labor cost)
    macsyma
--------------------------------------------
    expert systems
An expert system is a computer-based system which  knows about a 
specific domain and is able to perform with competence approaching that of an
expert. Such systems are becoming available as a result of recent research in
ai. At least two factors  have encouraged such development: first, there is a
limited supply of expertise in most technical areas; that is not  a recent
development, but modern communications (television in particular) have
made it clearer to a wide audience that part of the world has  access to expertise
substantially superior to that which they enjoy. Since "natural" expertise
is a slowly grown commodity, "artificial" expertise becomes attractive.
For example, assume we have managed to capture a fair portion of the diagnostic
intuition of a renowned physician, and have  encoded that on a machine along with
a flexible data base which contains the most modern of results in internal medicine.
Such a package would be of significant value to many of the world's physicians,
not as a device to replace them but as a helpful assistant.

The combination of ai research and modern computer technology  is making
such expert systems a reality.

Management information systems which  understand
dendral
mycin
randy
lenat

--------------------------------------------
    natural language/db
The more general setting in which  we should expect to find the expert system
is that of a combined natural language-data base-deductive facility.
Again, such systems are the direct result of ai research coming to  maturity.

--------------------------------------------
    general programming
One should not forget that LISP is also a general purpose programming language with
power superior to that supplied by most other languages. A modern LISP contains
all of the programming conveniences of a high level language, excelling in its
uniform treatment of data structures and its programming environments.
As a superior general purpose language, it is therefore admirably applicable to
complex non-ai applications. We name two:

1. 
    tex/books/xgpsyn
The difficulties of document production are well known: produce the
manuscript, correct and revise it, transfer it to a publishing medium,
correct and revise the galleys, and iterate the process.  There is a general feeling
that the process can be aided immensely  by computers; if the process begins
on a computer using a text editor the creation and revision phase can be
improved; if that system  also contains document formatting programs complete
with type-setting conventions then the second phase can be improved.
What is needed is an inexpensive and comprehensive document preparation and 
publishing system which allows the author to control all phases of the 
publishing process including if necessary the actual publishing of the
document. Several experimental systems exist, the most  ambitious of which
is the TEX system of Don Knuth. 

The effort here is to integrate the creation and production phases of TEX
to take advantage of the high resolution display system and improved performance
of the specialized hardware.

2.
    technology transfer
One area of concern related to lisp-based prototypes is the transfer of these
systems from the development phase to a production and maintenance  environment.
In this latter stage, personnel is mostly concerned with ultra-reliability,
portability, and ease of minor modification. One  option is to effect the
transfer through a translation of the system to a PASCAL or ADA-like language,
thereby offering (perhaps) a wider distribution  both in terms of the
number of machines which will offer ADA as well as making the programs
accessible to those who understand PASCAL/ADA  but feel no inclination
to  understand LISP. Of course, not all application software is amenable
to such a translation process; there are some LISP programs which will not
translate into the confines of a PASCAL-like  language. However, those
programs whose development depended on LISP's  superior programming environment
but whose execution characteristics are straightforward computations,
are susceptible to such a translation.

3.
ada and  program  development systems  
An area closely related to the technology transfer is the development  of
the programming environment for an ADA-like language. Given that we are developing
lisp hardware, and given that  the prototype AI programs are to be transferred
to the ADA community, then we should consider the possibility of  merging these
two endeavors.  This merger is  possible and indeed highly profitable.
We propose the development of an ADA-like language as an extension of the
programming base supplied by LISP. The following paragraphs outline the
rationale for this approach.

The appropriate perspective with which to view a modern LISP is as a "high-level
machine language"; it has the power and flexibility of a traditional machine
language, supporting all the low-level detailed data descriptions which
assembly languages offer, yet its language elements are of a character similar
to those supplied by a high level language.  It shares a further very important
trait with machine language: its programs are represented within the LISP machine
as data objects, thereby being accessible to  manipulation by other programs.
This program-data duality is the critical distinction  which gives LISP its
power in the development phases; editors, debuggers, and other program 
manipulation systems are able to freely operate on the internal representations.

Of course this freedom can be misused, even unintentionally by a well-meaning
but unskilled user.  This misuse is one of the traditional arguments against
machine language and for high level languages: high level languages help
protect the unwary and control program complexity.  Traditionally, the
convenience of a high level notation was purchased at the cost of a loss of
flexibility; a very severe translation phase was introduced which converted
the high level notation to machine code. This process results in severe debugging
problems in relating  the high level external language  with the low level
executable code. The existence of a machine which directly executes LISP
mitigates these problems. Assume an external PASCAL/ADA syntax; we supply
a parser-unparser program which performs the  translations between the external
syntax and LISP. Indeed, the main function of contemporary parsers is to
translate the external syntax into a LISP-like parse tree, only now
that tree structure is directly executable code.  The debugging problem becomes
much simpler since there is a close relationship between the executable LISP code
and the external syntax; that relationship can  be made explicit through the
unparser. The program development personnel can use either the internal LISP
representation or, if convenient, the external syntax.  In either case,
when the programming is stabilized the external program form can be made
exportable with the assurance that it is equivalent to the internal LISP
program.


As currently proposed in the sandman document
the programming environment is  sadly lacking in the conveniences which LISP
programmers have come to expect. An elegant and useful demonstration of the 
power of LISP-like languages would involve the implementation of an ADA-like 
language within the LISP environment. Besides developing a timely implementation of
the language we would also demonstrate the power of LISP as a development tool,
using a parser-unparser technique to translate the external syntax into
the LISP internal form  and to retransmit internal forms to the user in the
external syntax. All programming development would appear to the user in
the external syntax, but the programming system would take good advantage
of the superior LISP graph-structure internally.

--------------------------------------------
    mr/structured-based system/constraints
The combination of the LISP philosophy and the availability of quality 
LISP machines gives us the tools for finally making progress on the problem
called the "software crisis".



--------------------------------------------
    logo/education/smalltalk
--------------------------------------------
    research/texas/ti
--------------------------------------------
General reasons for a language based on lisp

1) what a language should have for real-time symbolic computation
  data structure flexibility; error checks and recovery in on-line running
	code.

  program≡data
	a. for macro facility: manipulate program and then execute it
	b. editing and modification
	c. self repair

  big virtual memory and file system(?)
	VM removes much burden from user in formulating problems
	VM gives added capability in building data bases
	VM can take over many functions of file system

  output is graphically oriented (display based systems)

  strong interrupt system and process control

  excellent software tools  (editors, debuggers, extensible parsers and unparsers
	 compilers)

  large efficient data base(s)

  generalized control structures

  constraints in program development phases

--------------------------------------------

There is a growing  awareness that the quality of computation must
be improved; computer systems are expected to be late, bug-ridden
and difficult to maintain. Part of the difficulty stems from the growing 
complexity of the systems we are building, but that is a symptom not a
cause. 

A popular contemporary solution to the tools problem is being advocated
by the structured programming school. They suggest that the difficulty lies in the 
lack of discipline within the programming process. The solution is to
educate the programming community to follow certain programming styles,
making explicit all information about the objects being manipulated
and elaborating the algorithms which manipulate those objects
with information about "what" is being computed as well as "how" the
computation is to be done. The methodology includes programming languages
which enforce this discipline. 

Though the goals are admirable, we have serious difficulties with the 
approach.
...mumble, mumble...

Our solution is based on the following  observations:

1. Programming is difficult. Effective programming requires a combination
of clever people, flexible tools, and appropriate training. In effect,
programming is no different than any other profession. As such, it requires
a comparable professional attitude and understanding.

2. The exemplar tools builders of computer science have been the artificial
intelligence community. The computer systems which form the backbone of
AI research have been among the best in the world. The excellence
of their tools is necessitated by the complexity of the problem domains
which they have been investigating. AI problems are not well-suited
to a "disciplined" programming approach. There are no a priori algorithms
for the phenomena which AI studies; usually the only complete statement
of the solution is the resulting large, complex program.

3. Modern software systems are approaching the complexity of traditional
AI investigations. Distributed operating systems, information retrieval
systems, and even the development systems for CAD and hardware testing
are  growing in sophistication and complexity.


4. Inexpensive micro hardware is being developed whose power surpasses
current minicomputers, and mainframes of a few years ago.
This is a double edged  sword: the hardware will allow even more
complex systems to be built, further exaggerating the software development
problem. On the other side, the hardware availability also offers a solution 
to the problem; it is now financially attractive to place the AI tools
in a wider user community, giving them the added leverage in solving the
problems.

lr
  cdc/pdp-10
  not worth it

advertised vs reality
  already spelled out in oct
  80x24

alpha
  late
  hobby machine

purchasing  dept


lack of progress
  6 mons salaries
  "from 7th to 2nd"

danray

8048/lisp
  not professional

general lack of management

a two-stack non-recursive version of vlisp

send: 	 val is in hl

receive: look at hl

next:	 push val,ns
	 push name,ns


        DEEP			SHALLOW
link:	push env,cs		push dest,cs
	env ← sp		dest ← sp
				while names ≠ null swap name(i) with cval(i)

unlink  pop env,cs		while names≠null swap name(i) with cval(i)
	sp ← env		pop dest,cs
				sp ← dest

lookup  usual			usual



eval:	isconst(exp)   	push exp,ns;pop cont,cs;jump loop

	isvar(exp)	lookup(exp);push val,ns;pop cont,cs;jump loop

	t		push fun,cs
			push arg,cs
			fun ← func(exp)
			push 'ev1,cs
			cont ← 'eval1
			jump loop
		

loop:  mung display if needed; pc←cont

ev1:	pop arg,cs
	pop fun,cs
	pop cont,cs
	jump loop


eval1:	atom(fun) 	lookup(fun)

	;drops through
	linker(fun)	jump@ fun
	
	t		push exp,cs
			push 'ev3,cs
			exp ← fun
			cont ← 'eval
			jump loop

ev3:	pop exp,cs
	push fun,cs
	push 'ev4,cs
	fun ← val
	cont ← 'ev4


ev4:	pop fun,cs
	pop exp,cs

----------------------
here's a "typing function" which will  make a lambda linker:

mk_λ_link:
	linkspace[i]←cdr(exp)     <((xi, x2, ...xn) body))>
	linkspace[i+1] ← "lhld *-1"
	linkspace[i+2] ← "jmp link_λ"
	push 'linkspace[i+1],cs   <new code for every such λ>
	i←i+3;
	cont←'evalargs

link_λ:	while vars≠null ∧ tos≠"marker"
		(x[i] ↔ cval[i]
		 incr[i])
		push ns,cs
		push "marker",ns
		push args,cs
		push 'ev6 cs
		args ← body
		cont ← 'evalargs
		go loop

ev6:	pop args,cs
	while tos≠"marker"
		(cval[i]←x[i]
		 incr[i])
		pop ns,cs
		pop cont,cs
		go loop

evalargs:
	null(args)	pop cont,cs; 
			jump loop

	t		push exp,cs
			exp←car(args)
			cont←'ev9
			jmp loop


ev9:	null(cdr(args))	push 'ev10,cs; 
			cont ← 'eval; 
			jump loop

	t		push 'ev11,cs; 
			cont ← 'eval; 
			jump loop

ev10:	pop cont,cs; 
	pop exp; 
	jump loop;

ev11:	push ?
	args ← cdr(args)
	exp ← car(args);
	cont ← 'ev9;
	jump loop;
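
For comparison, the same control idea rendered as a toy in Common Lisp: eval written as a state machine with an explicit value stack (ns) and control stack (cs) and no host recursion. It covers only constants, variables, and one-argument literal-lambda application; it is a sketch of the structure, not of the dialect above.

(defun machine-eval (exp env)
  (let ((ns '()) (cs '()) (state 'eval))
    (loop
      (ecase state
        (eval
         (cond ((numberp exp)                      ; constant: value onto ns
                (push exp ns) (setq state 'return))
               ((symbolp exp)                      ; variable: a-list lookup
                (push (cdr (assoc exp env)) ns) (setq state 'return))
               (t                                  ; application (fun arg)
                (push (list 'apply (first exp) env) cs)
                (setq exp (second exp) state 'eval))))
        (return
         (if (null cs)
             (return-from machine-eval (pop ns))   ; nothing pending: done
             (destructuring-bind (tag fun saved-env) (pop cs)
               (declare (ignore tag))
               ;; fun is a literal (lambda (x) body): λ-bind the argument, run the body
               (setq env (acons (first (second fun)) (pop ns) saved-env)
                     exp (third fun)
                     state 'eval))))))))

;; (machine-eval '((lambda (x) x) 42) '())   =>  42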

data-driven compiler


e.g.  to compile while

atom for while has
[compile . while_compile]  property pair


(COMP EXP DEST SLOTLIST)
      (COND ((ATOM EXP) (COND ((CONST EXP) (COND ((SLOTTED CONST))
						 (T (EMIT EXP DEST)
						     (SLOT CONST DEST))))
			      ((VAR EXP) (COND ((SLOTTED EXP))
					       (T (EMIT EXP DEST)
						  (SLOT EXP DEST))))))

	    (T((GET (FIRST EXP) 'COMPILE) (REST EXP) SLOTLIST))))


(EVAL EXP DEST ENV)
      (COND ((ATOM EXP) (COND ((CONST EXP) (SEND (DENOTE CONST) DEST)) 

			      ((VAR EXP) (SEND (LOOKUP VAR ENV) DEST))

	    (T((GET (FIRST EXP) 'EVALUATE) (REST EXP) DEST ENV))))
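
A small hedged rendering of this dispatch in Common Lisp (names invented): the core evaluator handles atoms itself and otherwise funcalls whatever evaluator the leading atom carries on its property list.

(defun driven-eval (exp env)
  (cond ((numberp exp) exp)                        ; constants denote themselves
        ((symbolp exp) (cdr (assoc exp env)))      ; variables: a-list lookup
        (t (funcall (get (first exp) 'evaluate)    ; data-driven dispatch
                    (rest exp) env))))

;; each special form installs its own evaluator, e.g. IF with then and else arms:
(setf (get 'if 'evaluate)
      (lambda (args env)
        (if (driven-eval (first args) env)
            (driven-eval (second args) env)
            (driven-eval (third args) env))))

;; (driven-eval '(if 1 10 20) '())   =>  10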


exp 	const
	var
	cond
	while
	prog1
	progn
	fn-call
	prim-call
	...

while 	<p> {<exp>}

cond {<p> <exp>}

ex. (to compile while  [gensym→L; 
			compile<p>→A; 
			compile[(cond ((null A) (go (gensym)→L1))]
			compile {<exp>}
			compile[(go L)]
			L1 ]


    (to evaluate while [gensym→L; 
			evaluate <e> →A;
			evaluate[(cond ((null A) (go (gensym)→L1))]
	*better*	if null A (go (gensym)→L1);
			evaluate {<exp>}
			evaluate[(go L)]
			L1 ]





(to compile cond (gensym→L)
	         {if null[cond]→L; exit;
		  compile <p>→A;
		  compile[if((null A) (go (gensym)→L1))]
		  compile <exp>
		  compile[(go L)]
		  L1
		  decr[cond]}
		  




(to evaluate cond { 	if null[cond] err[]
			evaluate <p>→A
			if A evaluate <exp>;exit
			decr cond}



(to evaluate 	fsubr jump @linker			eg (IF t1 t2 t3)

(to evaluate	subr	 length[subr] ≠ args[subr] →loss
			 if null[subr] jmp @linker
			 alloc_frame →dest
		      	 evaluate {<exp>}
			 link dest env
			 jump @linker


(to compile 	expr
		fexpr
		macro




(DEFUN FEXPR TO (L)
	(PUTPROP 

cheapy
1. mcgraw-hill book and consulting
2. lisp for hard disc from alpha
3. hardware from western digital
4. applications
5. mod2
6. plm's

This document  outlines  a  minimum  risk plan  to  develop  a  basis  for
artificial intelligence packages. It is based on an essentially one-person
part-time effort, each step presupposing  the existence (and success!)  of
the previous.

The cornerstone  of this  effort  is  the existence  of  a  satisfactory
"development" machine.  "Development"  is  meant  both  in  the  sense  of
document production  and  program  production.  The  initial  proposal  to
McGraw-Hill, dated Jan 7, 1979, asks for their support in purchasing  such
a system in return for the  preparation of an introductory LISP book.   In
essence, the argument to them states that LISP's popularity is growing and
that the introductory book  can perform a service  similar to that of  Ken
Bowles' book, "Problem Solving in Pascal".  Specifically, the  forthcoming
BYTE issues and "Anatomy of LISP"  make LISP details available to a  wider
audience.  The  interest  in   artificial  intelligence  applications   of
computing is already there; witness the BYTE readership; witness the heavy
investment by  TI  in  AI,  and  the  growing  concern  from  conservative
companies like HP.

The remainder of my argument to them involved my service to their staff on
the appropriate ways for a publishing concern to prepare for the computer.
The publishing  industry  is  archaic;  their  procedures  for  processing
manuscripts have only marginally been  affected by the computer.  Since  I
have been "through the mill" with them, I  feel that there is a lot I  can
tell them about modern document preparation.

A secondary  concern for  their  consideration was  my  offer to  build  a
demonstration document production  system within the  process of  creating
their text.  That  system  will have  significant  marketable  value  when
transported to a new generation micro-processor.

Basically the  offer  to McGraw-Hill  asks  for a  manuscript  preparation
grant, sufficient to develop facilities for their text. Once that text  is
produced, the  facilities become  mine. However,  while that  contract  is
running I expect  to pursue additional  avenues. First, the  text will  be
geared to the existence of a "standard" LISP system, much in the spirit of
UCSD Pascal.  It should  be advantageous  for a  manufacturer of  personal
computers to possess  this LISP.  In particular, the  manufacturer of  the
development machine used in  the book production  should be interested  in
supporting my efforts; support should include hardware, software, and some
reasonable amount of money. In  turn, they would have  a jump on the  LISP
market.

Midstream in this project, a hardware effort should begin to settle on  an
inexpensive architecture for a  LISP machine; either  an L-code machine  in
the spirit  of  the  "Pascal micro-engine"  or,  depending  on  economics,
perhaps an interpreter running on one of the new microprocessors.

In either case, towards  the end of this  project sufficient capital  should
have accrued that applications of the document preparation tools, of LISP,
and of the hardware can begin  to surface. Completed packages should  come
first; they  will involve  applications  of AI  to business  and  personal
areas. For  example,  subsets on  natural  language for  communication  of
requests to  the machine.  As stated  earlier, the  more mundane  document
production facilities can also be exploited.  Also, the LISP base can  be
exploited for building non-LISP applications;  since LISP is an  excellent
systems-building tool the speedy development  of languages such as  Pascal
and Modula dialects  should be possible.   Later the development  machines
can be marketed, but initially they should remain controlled.

This scheme is built on the gradual expansion of a controllable base.  Its
advantage is  clear:  except  for  the book,  all  other  efforts  can  be
terminated without financial  loss. The  scheme also  has weaknesses:   in
particular, it  is stretched  out  over several  years, being  a  definite
part-time effort  at least  at the  inception. It  is not  clear that  the
maximally profitable window for such products will remain open this  long.
The technology is spreading rapidly; so there is unquestionable benefit in
being first with the combination of new software and new hardware.


grand
0. tools 11xZ
1. lisp
2. mod(2)
3. applications
4. plm's

This proposal assumes that sufficient capital is available immediately  to
begin the development  of AI  related computer  goods. As  with the  other
proposal, several interesting and profitable objects can be produced along
the way, but the thrust of this  effort is the immediate production of  the
new systems.  The proposal assumes the immediate acquisition of a  machine
at least comparable to that  of the graduated proposal; better  facilities
would mean faster  development and  additional cost.  Careful study  would
need to be made of the trade-offs.

The major purpose of the development machine is to act as a software  base
to supply the  new processor.  Only the tools  which are  needed for  this
effort would  be developed  on the  host. The  idea is  to get  first-rate
software available on the new processor  as soon as possible. This  effort
 would probably  cover the  cost of  the development  effort all  by
itself, since it  may be  several years  before quality  software will  be
forthcoming from the manufacturers.

The software which is spun off at this  time would be of high quality,  but
of a  rather  pedestrian  type:  perhaps Modula  (if  that  is  a  popular
commodity), surrounded  with  quality  program  and  document  development
tools. This effort can be encapsulated and maintained by a separate branch
while development  continues  on  capturing  the  artificial  intelligence
technology.

The major goal is a transfer of artificial intelligence research into
the business and consumer marketplace. One particularly attractive  application
is  the packaging of an acceptable subset of natural language as a
computer query and command language. Within well-defined "expert domains"
such